
FINGERPRINT BIOMETRICS ATTENDANCE SYSTEM


1.1 Background to the Project

While not prehistoric, biometrics have been around for thousands of years. Over the past few millennia, biometrics has progressed from a crude means of categorization to an authenticator of identity employing a variety of modalities, so it is worth looking back to see where biometrics has been and how far it has come (El-Abed, 2012). While the earliest descriptions of biometrics can be traced back to 500 BC in the Babylonian kingdom, the first recorded biometric identification system appeared in the 1800s in Paris, France. Alphonse Bertillon devised a technique of unique body measurements for the categorization and comparison of convicts. While this technique was far from flawless, it was the first to use unique biological traits to authenticate identity (Admin, 2021).

Fingerprinting followed suit in the 1880s, not only as a method of identifying criminals but also as a form of signature on contracts. It was recognized that a fingerprint is a marker of a person's identity and could be used to hold someone accountable. While there is disagreement about who first used fingerprints for identification, Edward Henry is credited with developing a fingerprinting standard known as the Henry Classification System (Admin, 2021).

The Henry Classification System was the first fingerprint-based identification system. It was soon embraced by law enforcement, replacing Bertillon's methods as the norm for criminal identification, and it sparked a century of research into which other distinct physiological traits might be exploited for identification (Admin, 2021). Biometrics evolved tremendously as a subject of study throughout the following century. There were so many breakthroughs in the 1900s that it would be impossible to mention them all, so the highlights from the second half of the century are summarized below.

Semi-automated facial recognition systems were created in the 1960s, requiring administrators to examine facial characteristics within an image and extract useful feature points, a far more manual process than the systems we now use to unlock our phones. By 1969, fingerprint and face recognition had become so common in law enforcement that the FBI allocated funds to the development of automated methods, which spurred the development of increasingly advanced sensors for biometric capture and data extraction. In the 1980s, the National Institute of Standards and Technology established a speech group to explore and advance speech recognition technology; its experiments served as the foundation for today's voice command and recognition systems. In 1985, it was postulated that irises, like fingerprints, are unique to each individual, and by 1994 the first iris recognition algorithm had been patented. It was also shown that the blood vessel patterns in the eye are unique to each individual and could be used for authentication. In 1991, face detection technology was developed, allowing for real-time recognition. While these early techniques had numerous flaws, they sparked a surge of interest in facial recognition research.

Hundreds of biometric recognition algorithms had been developed and patented in the United States by the 2000s. Biometrics were no longer confined to large corporations and government; they were marketed commercially and deployed at large-scale events such as the 2001 Super Bowl. Biometric technology research has advanced at a remarkable pace in the last decade alone, and biometrics has progressed from a revolutionary technology to an integral element of daily life. In 2013, Apple added fingerprint recognition to the iPhone, ushering in widespread acceptance of biometric identification. Most smartphones now have biometric capabilities, and many applications employ biometrics as an authenticator for everyday activities.

Even with all of this progress, the possibilities of biometric authentication and identification are far from exhausted. As biometrics research advances, it is expected to be combined with artificial intelligence, with the goal of creating biometric devices and systems that can learn and adapt to their users, producing a smooth authentication experience. As biometrics become increasingly ubiquitous, the use of identity proxies may become obsolete: there is no need to carry keys, cards, or fobs when you can use yourself as proof of your identity. A future of frictionless transactions, interactions, and access control may be on the horizon (Admin, 2021).

1.2 Statement of Problem

As the student population grows, so does the number of names on the attendance list. Storing these attendance sheets becomes difficult, and no suitable backup is kept. Most colleges still employ the conventional method, mostly in lecture halls and laboratories: the teacher or lecturer hands out a sheet of paper listing students' names to sign, or, in some cases, the students must fill in their name, student ID, and matriculation number to demonstrate their attendance in a specific class. With this approach, fabrication of the student attendance list is prevalent; if a student is absent, another student can simply sign in their place. To avoid this problem, it is necessary to create a fingerprint authentication system for students. Biometric recognition will be used to record and keep track of every student's attendance in a given class, as sketched below.
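
As a minimal illustration of the proposed approach, the following sketch (in Python) shows how a scanned fingerprint could be matched against enrolled templates so that attendance is recorded only for the student whose finger was actually scanned. The names used here (Student, AttendanceRegister, mark_attendance) are hypothetical, and the exact byte comparison is a stand-in for the template matching that a real fingerprint sensor SDK would perform.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional, Tuple

@dataclass
class Student:
    matric_no: str
    name: str
    template: bytes  # fingerprint template captured at enrolment

@dataclass
class AttendanceRegister:
    course_code: str
    enrolled: List[Student] = field(default_factory=list)
    records: List[Tuple[str, datetime]] = field(default_factory=list)

    def enrol(self, student: Student) -> None:
        # Store the student's details and fingerprint template once.
        self.enrolled.append(student)

    def mark_attendance(self, scanned_template: bytes) -> Optional[str]:
        # Compare the scanned print with every enrolled template; a real
        # system would call a fingerprint SDK matcher instead of "==".
        for student in self.enrolled:
            if scanned_template == student.template:
                self.records.append((student.matric_no, datetime.now()))
                return student.matric_no
        return None  # unrecognized finger: nobody can sign for another

# Usage sketch
register = AttendanceRegister("CSC 101")
register.enrol(Student("U2021/0001", "Ada", b"\x01\x02\x03"))
print(register.mark_attendance(b"\x01\x02\x03"))  # "U2021/0001"
print(register.mark_attendance(b"\xff\xff\xff"))  # None (rejected)

Because attendance is keyed to the fingerprint presented at the reader rather than to a signature on paper, one student cannot mark attendance on behalf of another, and the records list provides a timestamped, easily backed-up register for the class.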